
Human Media Lab (from the English Wikipedia)


The Human Media Lab (HML) is a research laboratory in Human-Computer Interaction at Queen's University's School of Computing in Kingston, Ontario. Its goals are to advance user interface design by creating and empirically evaluating disruptive new user interface technologies, and to educate graduate students in this process. The Human Media Lab was founded in 2000 by Prof. Roel Vertegaal and employs an average of 12 graduate students.
The laboratory is well known for its pioneering work on flexible display interaction and paper computers, with systems such as PaperWindows (2004), PaperPhone (2010) and PaperTab (2012). HML is also known for its invention of ubiquitous eye input, such as Samsung's Smart Pause and Smart Scroll technologies.
==Research==
In 2003, researchers at the Human Media Lab helped shape the paradigm of Attentive User Interfaces,〔Vertegaal, R. (2003). Attentive User Interfaces. Editorial, In Special Issue on Attentive User Interfaces, Communications of the ACM 46(3), ACM Press, 30-33.〕 demonstrating how groups of computers could use human social cues for considerate notification.〔Gibbs, W. (2005). Considerate Computing. Scientific American 292, 54-61.〕 Amongst HML's early inventions was the eye contact sensor, first demonstrated to the public on ABC's Good Morning America.〔Vertegaal, R., Dickie, C., Sohn, C. and Flickner, M. (2002). Designing attentive cell phone using wearable eyecontact sensors. In CHI '02 Extended Abstracts on Human Factors in Computing Systems. ACM Press, pp. 646-647.〕 Attentive User Interfaces developed at the time included an early iPhone prototype that used eye-tracking electronic glasses to determine whether users were in a conversation, an attentive television that played or paused content when the viewer looked away, mobile Smart Pause and Smart Scroll (adopted in Samsung's Galaxy S4),〔Dickie, C., Vertegaal, R., Sohn, C. and Cheng, D. (2005). eyeLook: using attention to facilitate mobile media consumption. In Proceedings of the 18th annual ACM symposium on User interface software and technology (UIST '05). ACM Press, 103-106.〕 as well as a technique for calibration-free eye tracking that places invisible infrared markers in the scene.
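The attentive play/pause behaviour described above can be sketched in a few lines of Python. This is an illustrative model only: the class, method names, and frame threshold are hypothetical and are not drawn from HML's or Samsung's actual implementations.

```python
class AttentivePlayer:
    """Toy model of Smart Pause-style attentive playback: pause after the
    viewer has looked away for a few frames, resume on renewed eye contact.
    All names and thresholds here are illustrative assumptions."""

    def __init__(self, away_threshold=3):
        self.away_threshold = away_threshold  # frames without eye contact before pausing
        self.away_frames = 0                  # consecutive frames with no eye contact
        self.playing = True

    def update(self, eye_contact):
        """Feed one frame of eye-contact sensor output; return the playing state."""
        if eye_contact:
            self.away_frames = 0
            self.playing = True               # resume as soon as the viewer looks back
        else:
            self.away_frames += 1
            if self.away_frames >= self.away_threshold:
                self.playing = False          # viewer has looked away long enough: pause
        return self.playing
```

A brief glance away (fewer frames than the threshold) leaves playback running, which is why a dead-band of a few frames is useful: raw eye-contact detection is noisy.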
Current research at the Human Media Lab focuses on the development of Organic User Interfaces: user interfaces with non-flat displays. In 2004, researchers at HML built the first bendable paper computer, PaperWindows,〔Holman, D., Vertegaal, R. and Troje, N. (2005). PaperWindows: Interaction Techniques for Digital Paper. In Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems. ACM Press, 591-599.〕 which premiered at CHI 2005. It simulated multiple flexible, high-resolution, colour, wireless, thin-film multitouch displays using real-time depth-camera 3D Spatial Augmented Reality. In May 2007, HML coined the term Organic User Interfaces.〔Vertegaal, R. and Poupyrev, I. (2008). Introduction to Organic User Interfaces. In Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 5-6.〕 Early Organic User Interfaces developed at HML included the first multitouch spherical display,〔Holman, D. and Vertegaal, R. (2008). Organic User Interfaces: Designing Computers in Any Way, Shape, or Form. In Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 48-55.〕 and Dynacan, an interactive pop can: early examples of everyday computational things with interactive digital skins.〔Akaoka, E., Ginn, T. and Vertegaal, R. (2010). DisplayObjects: Prototyping Functional Physical Interfaces on 3D Styrofoam, Paper or Cardboard Models. In Proceedings of TEI'10 Conference on Tangible, Embedded and Embodied Interaction. ACM Press, 49-56.〕〔Vertegaal, R. (2011). The (Re)Usability of Everyday Computational Things. In ACM Interactions Magazine, ACM Press, Jan/Feb 2011, 39-41.〕 In 2010, the Human Media Lab, with Arizona State University, developed the world's first functional flexible smartphone, PaperPhone. It pioneered bend interactions and was first shown to the public at ACM CHI 2011 in Vancouver.〔Lahey, B., Girouard, A., Burleson, W. and Vertegaal, R. (2011).
PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays. In Proceedings of ACM CHI'11 Conference on Human Factors in Computing Systems, ACM Press, 1303-1312.〕 In 2012, the Human Media Lab introduced the world's first pseudo-holographic, life-size 3D video conferencing system, TeleHuman.〔Kingsley, J. with will.i.am. (2013). Use Your Illusion. Wired UK, August 2013, 140-141.〕〔Kim, K., Bolton, J., Girouard, A., Cooperstock, J. and Vertegaal, R. (2012). TeleHuman: Effects of 3D Perspective on Gaze and Pose Estimation with a Life-size Cylindrical Telepresence Pod. In Proceedings of CHI'12 Conference on Human Factors in Computing Systems, ACM Press, 2531-2540.〕 In 2013, HML researchers unveiled PaperTab,〔Warner, B. (2013). PaperTab a Fold-Up, Roll-Up Tablet Computer. Bloomberg Businessweek, May 2013.〕 the world's first flexible tablet PC, at CES 2013 in Las Vegas, in collaboration with Plastic Logic and Intel.
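At its simplest, a bend interaction like those pioneered by PaperPhone can be modelled as thresholding a bend-sensor reading at a display corner. The function below is a minimal Python sketch under assumed names and thresholds; it does not reflect PaperPhone's published gesture set or sensor values.

```python
def classify_bend(angle_deg, dead_zone=5.0):
    """Map one bend-sensor reading (degrees of flex at a display corner)
    to a page-navigation action. The gesture names and the 5-degree dead
    zone are illustrative assumptions, not values from the PaperPhone study."""
    if angle_deg > dead_zone:
        return "page_forward"   # corner bent toward the viewer
    if angle_deg < -dead_zone:
        return "page_back"      # corner bent away from the viewer
    return "none"               # within the dead zone: treat as sensor noise
```

The dead zone matters in practice: a thin-film display at rest is never perfectly flat, so small readings must be ignored rather than triggering spurious page turns.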

Excerpt source: Wikipedia, the free encyclopedia. Read the full "Human Media Lab" article on Wikipedia.




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.